Transfer learning based hierarchical attention neural network for sentiment analysis
QU Zhaowei, WANG Yuan, WANG Xiaoru
Journal of Computer Applications    2018, 38 (11): 3053-3056.   DOI: 10.11772/j.issn.1001-9081.2018041363
Abstract
The purpose of document-level sentiment analysis is to predict the sentiment that a user expresses in a document. Traditional neural network-based methods rely on unsupervised word vectors, which cannot accurately capture contextual relationships or the meaning of a word in its context. Moreover, the Recurrent Neural Network (RNN) generally used for sentiment analysis has a complex structure and numerous model parameters. To address these issues, a Transfer Learning based Hierarchical Attention Neural Network (TLHANN) was proposed. Firstly, an encoder was trained on a machine translation task to learn contextual understanding and generate hidden vectors. Then, the encoder was transferred to the sentiment analysis task by concatenating the hidden vector it generates with the corresponding unsupervised word vector; this distributed representation captures contextual relationships better. Finally, a two-level hierarchical network was applied to the sentiment analysis task, with a simplified RNN unit, the Minimal Gated Unit (MGU), arranged at each level, leading to fewer parameters. The attention mechanism was used in the model to extract important information. The experimental results show that the accuracy of the proposed algorithm is increased by an average of 8.7% and 23.4% compared with the traditional neural network algorithm and Support Vector Machine (SVM), respectively.
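The MGU mentioned in the abstract reduces parameters by collapsing the multiple gates of an LSTM or GRU into a single forget gate. A minimal sketch of one MGU recurrence step, assuming the standard MGU formulation (f = σ(W_f x + U_f h + b_f); h̃ = tanh(W_h x + U_h (f ⊙ h) + b_h); h' = (1 − f) ⊙ h + f ⊙ h̃); all weight values and variable names below are illustrative, not from the paper:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def mgu_step(x, h_prev, Wf, Uf, bf, Wh, Uh, bh):
    """One Minimal Gated Unit step: a single forget gate controls both
    the candidate state and the interpolation with the previous state."""
    n = len(h_prev)
    m = len(x)
    # forget gate: f = sigmoid(Wf x + Uf h_prev + bf)
    f = [sigmoid(sum(Wf[i][j] * x[j] for j in range(m))
                 + sum(Uf[i][j] * h_prev[j] for j in range(n)) + bf[i])
         for i in range(n)]
    # candidate state, computed from the gated previous state
    h_tilde = [math.tanh(sum(Wh[i][j] * x[j] for j in range(m))
                         + sum(Uh[i][j] * f[j] * h_prev[j] for j in range(n)) + bh[i])
               for i in range(n)]
    # new state interpolates between previous and candidate state
    return [(1 - f[i]) * h_prev[i] + f[i] * h_tilde[i] for i in range(n)]

# Illustrative call with toy 2-dimensional input and state
Wf = [[0.1, -0.2], [0.3, 0.1]]
Uf = [[0.2, 0.0], [0.0, 0.2]]
Wh = [[0.5, 0.1], [-0.1, 0.4]]
Uh = [[0.3, 0.0], [0.0, 0.3]]
h = mgu_step([1.0, -1.0], [0.0, 0.0], Wf, Uf, [0.0, 0.0], Wh, Uh, [0.0, 0.0])
```

In TLHANN this cell would run at both the word level and the sentence level of the hierarchy; with one gate instead of the GRU's two or the LSTM's three, each level carries roughly a third to half as many recurrent parameters.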